
    Ultrasound as non-destructive evaluation tool


    Dual-IRT-GAN: A defect-aware deep adversarial network to perform super-resolution tasks in infrared thermographic inspection

    Infrared Thermography (IRT) is a valuable diagnostic tool for detecting defects in fiber-reinforced polymers in a non-destructive manner through the measurement of the surface temperature distribution. Yet, thermal cameras typically have a low native spatial resolution, resulting in blurry, low-quality thermal image sequences. This study proposes a defect-aware Generative Adversarial Network (GAN) framework, termed Dual-IRT-GAN, to simultaneously perform Super-Resolution (SR) and defect detection tasks in infrared thermography. Furthermore, the visibility of defective regions in the generated high-resolution images is enhanced by leveraging defect-aware attention maps from segmented defect images. Following a series of augmentation techniques and a second-order degradation process, the Dual-IRT-GAN model is trained on an extensive numerically generated thermographic dataset of composite materials with various defect types, sizes, and depths. The high inference performance of the virtually trained Dual-IRT-GAN is demonstrated on several experimental thermographic datasets obtained from composite coupon specimens with various defect types, sizes, and depths, as well as from aircraft stiffened composite panels having real (production) defects.
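
    The second-order degradation process used to synthesize low-resolution training inputs from the numerically generated high-resolution frames is not detailed here. The sketch below assumes a common Real-ESRGAN-style recipe (blur, downsampling, additive noise, applied twice); all parameter values and helper functions are illustrative placeholders, not the settings used for Dual-IRT-GAN.

import numpy as np
from scipy.ndimage import gaussian_filter, zoom

def degrade_once(img, scale, sigma_blur, sigma_noise, rng):
    """One degradation stage: Gaussian blur, downsampling, additive Gaussian noise."""
    blurred = gaussian_filter(img, sigma=sigma_blur)
    down = zoom(blurred, 1.0 / scale, order=3)  # cubic-spline resampling as a stand-in for bicubic
    return down + rng.normal(0.0, sigma_noise, down.shape)

def second_order_degradation(hr_frame, scale=2, rng=None):
    """Apply two degradation stages, giving a total downscaling factor of scale**2."""
    if rng is None:
        rng = np.random.default_rng(0)
    lr = degrade_once(hr_frame, scale, sigma_blur=1.0, sigma_noise=0.01, rng=rng)
    lr = degrade_once(lr, scale, sigma_blur=0.5, sigma_noise=0.02, rng=rng)
    return lr

# Example: a 256x256 simulated surface-temperature frame becomes a 64x64 LR input.
hr = np.random.rand(256, 256).astype(np.float32)
lr = second_order_degradation(hr, scale=2)
print(hr.shape, lr.shape)  # (256, 256) (64, 64)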

    Defect-aware Super-resolution Thermography by Adversarial Learning

    Infrared thermography is a valuable non-destructive tool for the inspection of materials. It measures the surface temperature evolution, from which hidden defects may be detected. Yet, thermal cameras typically have a low native spatial resolution, resulting in blurry, low-quality thermal image sequences and videos. In this study, a novel adversarial deep learning framework, called Dual-IRT-GAN, is proposed for performing super-resolution tasks. Dual-IRT-GAN aims both to improve local texture details and to highlight defective regions. The model consists of two modules, SEGnet and SRnet, which carry out defect detection and super-resolution tasks, respectively. By leveraging the defect information from SEGnet, SRnet generates plausible high-resolution thermal images with an enhanced focus on defect regions. The generated high-resolution images are then passed to the discriminator for adversarial training within the GAN framework. The Dual-IRT-GAN model, trained exclusively on a virtual dataset, is demonstrated on experimental thermographic data obtained from fiber-reinforced polymers having a variety of defect types, sizes, and depths. The results show high performance in maintaining background color consistency, removing undesired noise, and highlighting defect zones with finer textural detail in the high-resolution output.
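
    The generator described above couples SEGnet (defect detection) with SRnet (super-resolution). The PyTorch sketch below shows one plausible way to wire such a dual-module generator, where the predicted defect map multiplicatively re-weights the SR features; the layer counts, channel widths, and this fusion rule are assumptions for illustration, not the architecture reported in the paper.

import torch
import torch.nn as nn

class SEGnet(nn.Module):
    """Predicts a per-pixel defect probability map from the low-resolution thermal image."""
    def __init__(self, ch=32):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(1, ch, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(ch, ch, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(ch, 1, 3, padding=1), nn.Sigmoid(),
        )

    def forward(self, x):
        return self.body(x)

class SRnet(nn.Module):
    """Upsamples the thermal image; features are re-weighted by the defect map."""
    def __init__(self, ch=32, scale=4):
        super().__init__()
        self.head = nn.Conv2d(1, ch, 3, padding=1)
        self.up = nn.Sequential(
            nn.Conv2d(ch, ch * scale * scale, 3, padding=1),
            nn.PixelShuffle(scale),
            nn.Conv2d(ch, 1, 3, padding=1),
        )

    def forward(self, x, defect_map):
        feats = self.head(x)
        feats = feats * (1.0 + defect_map)  # assumed fusion: emphasize defect regions
        return self.up(feats)

class DualIRTGenerator(nn.Module):
    """Couples the two modules: the segmentation output guides super-resolution."""
    def __init__(self, scale=4):
        super().__init__()
        self.segnet = SEGnet()
        self.srnet = SRnet(scale=scale)

    def forward(self, lr):
        defect_map = self.segnet(lr)
        return self.srnet(lr, defect_map), defect_map

lr = torch.randn(1, 1, 64, 64)  # single-channel thermal frame
sr, defect_map = DualIRTGenerator(scale=4)(lr)
print(sr.shape, defect_map.shape)  # [1, 1, 256, 256] and [1, 1, 64, 64]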

    Seeing (ultra)sound in real-time through the Acousto-PiezoLuminescent lens

    In this contribution, we focus on a recently developed piezoluminescent phosphor, BaSi2O2N2:Eu (BaSiON), and report on Acoustically induced PiezoLuminescence (APL). Insonification of the BaSiON phosphor with (ultra)sound waves leads to intense light emission patterns which are clearly visible to the naked eye. The emitted light intensity has been measured with a calibrated photometer, revealing that it is directly proportional to the applied acoustic power. As such, APL can be used to devise a simple but effective acoustic power sensor. Further, the emitted APL light pattern has a specific geometrical shape, which we successfully linked to the pressure field of the incident (ultra)sonic wave. This is explicitly demonstrated for an ultrasonic (f = 3.3 MHz) transducer. By varying the insonification distance (from near- to far-field), multiple 2D slices of the transducer's radiation field light up on the BaSiON phosphor plate. By photographing these light patterns and stacking them one after another, the 3D spatial radiation field of the ultrasonic transducer was reconstructed. Good agreement was found with both classical scanning hydrophone experiments and simulations. Recently, we found that APL can also be activated by acoustic waves in the kHz range, thus covering a wide frequency range. First preliminary results are shown.
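
    Two post-processing steps mentioned above lend themselves to a short numerical sketch: the reported proportionality between emitted light intensity and applied acoustic power (the basis of the proposed power sensor), and the reconstruction of the 3D radiation field by stacking photographed 2D slices. The NumPy sketch below is a minimal illustration; all calibration values and image data are hypothetical placeholders, not measurements from this work.

import numpy as np

# (1) Proportional fit I = a * P, as reported for the calibrated photometer readings.
#     The power set points and lux values below are hypothetical placeholders.
acoustic_power_W = np.array([0.5, 1.0, 2.0, 4.0])
measured_lux = np.array([3.1, 6.0, 12.2, 24.5])
sensitivity = acoustic_power_W @ measured_lux / (acoustic_power_W @ acoustic_power_W)
print(f"sensitivity ~ {sensitivity:.2f} lux/W")

def power_from_intensity(lux):
    """Invert the proportional calibration to use APL as an acoustic power sensor."""
    return lux / sensitivity

# (2) Reconstruct the 3D radiation field by stacking 2D light-pattern slices,
#     one photograph of the BaSiON plate per insonification distance.
slices = [np.random.rand(128, 128) for _ in range(20)]  # stand-ins for the photographs
field_3d = np.stack(slices, axis=0)  # shape: (n_distances, height, width)
print(field_3d.shape)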